40 research outputs found

    Trapped in the Digital Divide: The Distributive Paradigm in Community Informatics

    This paper argues that over-reliance on a distributive paradigm in community informatics practice has restricted the scope of the high-tech equity agenda. Drawing on five years of participatory action research with low-income women in upstate New York, I explore the ways in which distributive understandings of technology and inequality obscure the day-to-day interactions these women have with ICT and overlook their justified critical ambivalence towards technology. Finally, I offer unique insights and powerful strategies of resistance suggested by my research collaborators in a drawing exercise intended to elicit alternative articulations of digital equity. If we begin from their points of view, the problems and solutions that can and should be entertained in our scholarship and practice look quite different.

    Reclaiming our data: interim report, Detroit

    Our Data Bodies (ODB) is a research justice project that examines the impact of data collection and data-driven systems on the ability of marginalized peoples to meet their human needs. For the past three years, ODB has been working in three cities: Charlotte, North Carolina; Detroit, Michigan; and Los Angeles, California. To date, we have completed nearly 135 in-depth interviews with residents of these cities’ most historically marginalized neighborhoods. Our project combines community-based organizing, capacity-building, and rigorous academic research. Through collective community building and analysis of the stories we've collected, the ODB project has identified many similarities in how people across the three cities experience data collection and data-driven systems. Patterns have emerged, including insecurity and targeting, resignation and resistance, and the separation of families (whether through incarceration, detention, deportation, or foster care systems); these patterns speak to the way that individuals are forced to trade away their data to attain basic human needs. Our community members describe the experience of being forced to engage with intrusive and insecure data-driven systems because of their membership in groups that have historically faced exploitation, discrimination, predation, and other forms of structural violence. They’ve shared with us their experience of being caught in a cycle of injustice and the impact this feedback loop has on their livelihoods. Our interviewees also tell a story of wanting both privacy and the ability to be seen and heard as whole human beings. Charlotteans, Detroiters, and Angelenos are resilient in spite of persistent and destructive forms of surveillance and profiling. They believe in their humanity, value human relationships, and want respect and recognition in and beyond the systems of data collection that try to govern their lives.
Overall, surveillance and data collection are deeply connected to diversion from public benefits, insecure housing, loss of job opportunities, and the policing and criminalization of our communities. Whether the result of an error or something they have overcome, people’s data stays with them far longer than it does for more advantaged groups, and its impacts are profound.

    A Case for Humans-in-the-Loop: Decisions in the Presence of Erroneous Algorithmic Scores

    The increased use of algorithmic predictions in sensitive domains has been accompanied by both enthusiasm and concern. To understand the opportunities and risks of these technologies, it is key to study how experts alter their decisions when using such tools. In this paper, we study the adoption of an algorithmic tool used to assist child maltreatment hotline screening decisions. We focus on the question: Are humans capable of identifying cases in which the machine is wrong, and of overriding those recommendations? We first show that humans do alter their behavior when the tool is deployed. Then, we show that humans are less likely to adhere to the machine's recommendation when the score displayed is an incorrect estimate of risk, even when overriding the recommendation requires supervisory approval. These results highlight the risks of full automation and the importance of designing decision pipelines that provide humans with autonomy.
    Comment: Accepted at ACM Conference on Human Factors in Computing Systems (ACM CHI), 202

    Infrastructural Speculations: Tactics for Designing and Interrogating Lifeworlds

    This paper introduces “infrastructural speculations,” an orientation toward speculative design that considers the complex and long-lived relationships of technologies with broader systems, beyond moments of immediate invention and design. As modes of speculation are increasingly used to interrogate questions of broad societal concern, it is pertinent to develop an orientation that foregrounds the “lifeworld” of artifacts: the social, perceptual, and political environment in which they exist. While speculative designs often imply a lifeworld, infrastructural speculations place lifeworlds at the center of design concern, calling attention to the cultural, regulatory, environmental, and repair conditions that enable and surround particular future visions. By articulating connections and affinities between speculative design and infrastructure studies research, we contribute a set of design tactics for producing infrastructural speculations. These tactics help design researchers interrogate the complex and ongoing entanglements among technologies, institutions, practices, and systems of power when gauging the stakes of alternate lifeworlds.

    The Biometric Assemblage: Surveillance, Experimentation, Profit, and the Measuring of Refugee Bodies

    Biometric technologies are routinely used in the response to refugee crises, with the United Nations High Commissioner for Refugees (UNHCR) aiming to have all refugee data from across the world in a central population registry by the end of 2019. The article analyses biometrics, AI, and blockchain as part of a technological assemblage, which I term the biometric assemblage. The article identifies five intersecting logics which explain wider transformations within the humanitarian sector and in turn shape the biometric assemblage. The acceleration of the rate of biometric registrations in the humanitarian sector between 2002 and 2019 reveals serious concerns regarding bias, data safeguards, data-sharing practices with states and commercial companies, experimentation with untested technologies among vulnerable people, and, finally, ethics. Technological convergence amplifies risks associated with each constituent technology of the biometric assemblage. The paper finally argues that the biometric assemblage accentuates asymmetries between refugees and humanitarian agencies and ultimately entrenches inequalities in a global context.

    Bureaucracy as a Lens for Analyzing and Designing Algorithmic Systems

    Scholarship on algorithms has drawn on the analogy between algorithmic systems and bureaucracies to diagnose shortcomings in algorithmic decision-making. We extend the analogy further by drawing on Michel Crozier’s theory of bureaucratic organizations to analyze the relationship between algorithmic and human decision-making power. We present algorithms as analogous to impartial bureaucratic rules for controlling action, and argue that discretionary decision-making power in algorithmic systems accumulates at locations where uncertainty about the operation of algorithms persists. This key point of our essay connects with Alkhatib and Bernstein’s theory of ‘street-level algorithms’, and highlights that the role of human discretion in algorithmic systems is to accommodate uncertain situations which inflexible algorithms cannot handle. We conclude by discussing how the analysis and design of algorithmic systems could seek to identify and cultivate important sources of uncertainty, to enable the human discretionary work that enhances systemic resilience in the face of algorithmic errors.

    Decolonizing Privacy Studies

    This paper calls for epistemic disobedience in privacy studies by decolonizing the approach to privacy. As technology companies expand their reach worldwide, the notion of privacy continues to be viewed through an ethnocentric lens, drawing disproportionately from empirical evidence on Western, white, and middle-class demographics. We need to break away from the market-driven neoliberal ideology and the Development paradigm that have long dictated media studies if we are to foster more inclusive privacy policies. This paper offers a set of propositions to de-naturalize and estrange data from demographic generalizations and cultural assumptions, namely: (1) predicting privacy harms through the history of social practice, (2) recalibrating the core-periphery as evolving and moving targets, and (3) de-exoticizing “natives” by situating privacy in ludic digital cultures. In essence, decolonizing privacy studies is as much an act of reimagining people and place as it is of dismantling essentialisms that are regurgitated through scholarship.

    Making Sense of Imbrication: Popular Technology and ‘Inside-Out’ Methodologies. Paper read at Participatory Design Conference, 2004

    We describe a model popular technology education program based on feminist and Freirean principles. Participatory design and research methodologies that position facilitators and participants as co-producers were the basis for a series of collective research projects, which we then analyze for their contribution to the field of participatory design. Finally, we suggest that the democratization of technological citizenship can best be extended not through narrowly construed “technology training” programs but through “popular technology,” an empowering and visionary combination of popular education and participatory research and design that emphasizes critical technological literacy.
    Categories and Subject Descriptors: K.4.2 [Social Issues]: Technology literacy and social justice – computer literacy, collaborative design methodology, participatory design, digital equity